We've all dealt with activation functions while working with neural nets. - Sigmoid - Tanh - ReLU & Leaky ReLU - GELU Ever wondered why they are so important❓🤔 Let me explain it...
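
A quick sketch of these four activation functions in NumPy (a minimal illustration, not tied to any framework; the GELU here is the common tanh approximation):

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs to (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # Passes positives through, zeroes out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # Tanh approximation of GELU (popularized by Transformer models)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))        # [0. 0. 2.]
print(leaky_relu(x))
```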

4 machine learning cheatsheets from Stanford (that will save you hours of study):

A problem with large neural networks: They look like a mess. Chaos leads to issues. Here is a technique to introduce clarity 🔽 1/7 https://t.co/MJi55B7FKC

5 facts about Tanh. Tanh is an activation function used in complex Neural Networks. Here are some facts you must know about it. 🧵 https://t.co/E9HnpC7EpY
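
A few of the standard Tanh facts can be checked directly in NumPy (a minimal sketch):

```python
import numpy as np

x = np.linspace(-5, 5, 11)
y = np.tanh(x)

# Fact: the output is bounded in (-1, 1)
assert y.min() > -1 and y.max() < 1

# Fact: tanh is zero-centered (tanh(0) = 0), unlike sigmoid
assert np.tanh(0.0) == 0.0

# Fact: its derivative is 1 - tanh(x)^2, so it saturates for large |x|
grad = 1 - np.tanh(x) ** 2
print(grad[0], grad[5])  # tiny at x = -5, exactly 1 at x = 0
```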

AI-related important terms you must know for UPSC Prelims: (Google them and just understand the meaning of these terms) 1. Generative Pre-trained Transformer 2. Machine learning 3. Deep...

Terms related to AI that could be asked in the UPSC CSE exam: Artificial Intelligence (AI) Machine Learning (ML) Deep Learning Natural Language Processing (NLP) Neural Networks ChatGP...

Open this if you want to see how ChatGPT built a Deep Learning model starting from nothing.

Neural networks learn to approximate functions y = f(x). Inference through a trained network is akin to looking up, with a query (X), values (in a latent space encoded by nodes in hidden la...
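
One way to make that lookup analogy concrete is a softmax-weighted retrieval over stored key/value pairs (an illustrative sketch of soft lookup, not the literal mechanics of a trained network; `keys`, `values`, and `query` are made-up toy data):

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy "memory": keys index locations in a latent space, values are stored answers
keys = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
values = np.array([10.0, 20.0, 30.0])

# A query aligned with the first key should mostly retrieve the first value
query = np.array([2.0, 0.0])
weights = softmax(keys @ query)  # similarities -> soft lookup weights
answer = weights @ values        # weighted blend of stored values
print(answer)
```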

5 free ML tutorials from last week by @huggingface, @TensorFlow, and @UberEng. Thread 🧵

Open this if you want to see how ChatGPT built all my code.

Day 56 of #60daysOfMachineLearning 🔷 Long Short-Term Memory Neural Networks 🔷 Long Short-Term Memory (LSTM) networks are a type of artificial neural network that is specifically...
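
The core of an LSTM is a single gated step. Here is a minimal NumPy sketch of one cell update (random toy weights, purely illustrative; real frameworks learn `W`, `U`, `b` by backprop):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates decide what to forget, what to write, what to expose.

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    stacked as [input gate, forget gate, cell candidate, output gate].
    """
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])           # input gate: how much new info to write
    f = sigmoid(z[H:2*H])         # forget gate: how much old state to keep
    g = np.tanh(z[2*H:3*H])       # candidate cell update
    o = sigmoid(z[3*H:4*H])       # output gate: how much state to expose
    c_new = f * c + i * g         # "long-term" cell state
    h_new = o * np.tanh(c_new)    # "short-term" hidden state
    return h_new, c_new

# Toy sizes: input dim 3, hidden dim 2
rng = np.random.default_rng(0)
D, H = 3, 2
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):  # run over a short sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)  # (2,) (2,)
```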

Day 54 of #60daysOfMachineLearning 🔷 Feed Forward Neural Networks 🔷 A feedforward neural network is a type of neural network that consists of multiple layers of interconnected ne...
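
A feedforward network's forward pass is just matrix multiplies and nonlinearities, layer by layer, with no cycles. A minimal NumPy sketch (random toy weights, illustration only):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer is (W, b); data flows strictly forward through the stack
    for W, b in layers[:-1]:
        x = relu(W @ x + b)
    W, b = layers[-1]
    return W @ x + b  # linear output layer

# Toy network: 4 inputs -> 8 hidden -> 8 hidden -> 1 output
rng = np.random.default_rng(0)
sizes = [4, 8, 8, 1]
layers = [(rng.normal(size=(m, n)), np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

y = forward(rng.normal(size=4), layers)
print(y.shape)  # (1,)
```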

Choosing the right architecture for a neural network is key to the success of your machine learning model. 🧵 Here are some things to consider when designing neural networks 👇 http...

Day 53 of #60daysOfMachineLearning 🔷 Neural Networks 🔷 A neural network is a computational model that is inspired by the structure and function of the brain. 🧵 👇 https://t.co/JE...

FREE #Infographic: #ArtificialIntelligence, #MachineLearning, #NeuralNetworks, and #DeepLearning! #100DaysOfCode #5G #AI #Analytics #BigData #Cloud #Coding #Data #DataScience #Gi...

Here are the pictures of two different problems. Two classes. We want to use a neural network to solve these problems. What's the difference between them? 1 of 8 https://t.co/eQ...

The batch size is one of the most influential hyperparameters affecting the outcome of neural network training. Here is everything you need to know about the batch size when training a neural networ...
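
The mechanics behind the knob: the batch size controls how the training data is chunked per gradient update, so smaller batches mean noisier gradients but more updates per epoch. A minimal sketch of the mini-batching itself (toy random data, no actual training):

```python
import numpy as np

def minibatches(X, y, batch_size, rng):
    # Shuffle once per epoch, then yield fixed-size chunks
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)

for bs in (1, 32, 100):  # stochastic, mini-batch, full-batch
    n_updates = sum(1 for _ in minibatches(X, y, bs, rng))
    print(bs, n_updates)  # smaller batches -> more gradient updates per epoch
```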

3 papers to understand Time-Series Forecasting ⏳ better. 1. Time-series Extreme Event Forecasting @UberEng 2. AutoML for Time-Series Forecasting @GoogleAI 3. AR-Net @MetaAI A Th...

Batch Normalization is a technique used when training deep neural networks. But there's something different about it: Nobody agrees on how to use it. This was shocking to me! 1...
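
What everyone does agree on is the core operation: normalize each feature over the batch, then rescale and shift with learnable parameters. A minimal NumPy sketch of the training-time computation (the disagreement is mostly about placement relative to the activation; at inference, running averages replace the batch statistics):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then rescale/shift
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))  # a batch of activations
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))

print(out.mean(axis=0).round(3))  # ~0 per feature
print(out.std(axis=0).round(3))   # ~1 per feature
```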

Activation functions are a fundamental concept in machine learning. But most people reading this don't know how Sigmoid and Tanh differ. Here is what you are missing: 1 of 9
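
The key Sigmoid-vs-Tanh differences are easy to verify in NumPy (a minimal sketch):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-6, 6, 5)

# Difference 1: output range. Sigmoid -> (0, 1); tanh -> (-1, 1), zero-centered.
print(sigmoid(x).round(3))
print(np.tanh(x).round(3))

# Difference 2: tanh is just a rescaled, shifted sigmoid:
# tanh(x) = 2*sigmoid(2x) - 1
assert np.allclose(np.tanh(x), 2 * sigmoid(2 * x) - 1)

# Difference 3: gradient at 0 is 4x steeper for tanh (1.0 vs 0.25)
print(1 - np.tanh(0.0)**2, sigmoid(0.0) * (1 - sigmoid(0.0)))
```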

The Transformer is a magnificent neural network architecture because it is a general-purpose differentiable computer. It is simultaneously: 1) expressive (in the forward pass) 2)...